Structured Composition of Semantic Vectors
Authors
Abstract
Distributed models of semantics assume that word meanings can be discovered from “the company they keep.” Many such approaches learn semantics from large corpora, treating each document as an unstructured bag of words and ignoring syntax and compositionality within the document. In contrast, this paper proposes a structured vectorial semantic framework in which semantic vectors are defined and composed in syntactic context. As such, syntax and semantics are fully interactive: composition of semantic vectors necessarily produces a hypothetical syntactic parse. Evaluations show that using relationally clustered headwords as a semantic space in this framework improves on a syntax-only model in perplexity and parsing accuracy.
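As a minimal, purely illustrative sketch of the idea (not the paper's actual model): a head word's vector and a dependent's vector are combined into a phrase vector, and the combination simultaneously commits to a hypothetical syntactic attachment. The vectors, the word list, and the composition function below are invented for the example.

import numpy as np

def compose(head_vec, dep_vec, alpha=0.5):
    """Toy composition: a convex combination of head and dependent vectors.
    The real framework defines composition jointly with syntactic rules."""
    return alpha * head_vec + (1.0 - alpha) * dep_vec

# Hypothetical headword-cluster vectors (3-dimensional for readability).
vectors = {
    "eat":   np.array([0.9, 0.1, 0.0]),
    "pasta": np.array([0.2, 0.8, 0.0]),
}

# Composing "eat" with "pasta" yields a phrase vector *and* an implied parse
# fragment ("pasta" attaches to "eat" as object), mirroring the claim that
# vector composition necessarily produces a hypothetical syntactic analysis.
phrase_vec = compose(vectors["eat"], vectors["pasta"])
parse_fragment = ("eat", "obj", "pasta")
print(phrase_vec, parse_fragment)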
Similar papers
Sense Contextualization in a Dependency-Based Compositional Distributional Model
Little attention has been paid to distributional compositional methods which employ syntactically structured vector models. As word vectors belonging to different syntactic categories have incompatible syntactic distributions, no trivial compositional operation can be applied to combine them into a new compositional vector. In this article, we generalize the method described by Erk and Padó (20...
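A hedged sketch of the general idea this article builds on, loosely following Erk and Padó's contextualization scheme: a word's out-of-context vector is combined with the expectation vector contributed by its dependency partner. The vectors, words, and choice of component-wise product below are assumptions for illustration only.

import numpy as np

def contextualize(target_vec, partner_expectation, op=np.multiply):
    """Combine a target word vector with the expectation vector contributed
    by its syntactic partner (component-wise product is one common choice)."""
    return op(target_vec, partner_expectation)

run_vec = np.array([0.6, 0.4, 0.9])          # out-of-context vector for "run"
obj_expectation = np.array([0.1, 0.9, 0.2])  # what the object "marathon" expects of its verb

run_in_context = contextualize(run_vec, obj_expectation)
print(run_in_context)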
A Fuzzy Search Algorithm for Structured P2P Network Based on Multi-dimensional Semantic Matrix
Structured P2P networks are highly efficient and low-cost in resource search, but they support only single-keyword exact search rather than multi-keyword fuzzy search. This paper puts forward a novel search algorithm, FSA-MDSM, based on a semantic vector matrix, in which P2P node resources are transformed into a number of semantic vectors that are assembled into a vector matrix in terms of a fr...
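A rough sketch of the idea as described in the snippet (the FSA-MDSM algorithm itself is not detailed here): resources become rows of a semantic vector matrix, and a multi-keyword query is matched fuzzily by cosine similarity rather than by exact key lookup. All vectors below are made up.

import numpy as np

resource_matrix = np.array([
    [0.9, 0.1, 0.0],   # resource A
    [0.2, 0.8, 0.1],   # resource B
    [0.4, 0.4, 0.6],   # resource C
])
query = np.array([0.3, 0.7, 0.1])  # semantic vector built from several keywords

# Cosine similarity of the query against every resource row.
sims = resource_matrix @ query / (
    np.linalg.norm(resource_matrix, axis=1) * np.linalg.norm(query)
)
ranked = np.argsort(-sims)          # fuzzy ranking instead of exact matching
print(ranked, sims[ranked])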
Fuzzy multi-criteria decision making method based on fuzzy structured element with incomplete weight information
The fuzzy structured element (FSE) theory is a very useful tool for dealing with fuzzy multi-criteria decision making (MCDM) problems by transforming the criterion value vectors of each alternative into the corresponding criterion function vectors. In this paper, some concepts related to function vectors are first defined, such as the inner product of two function vectors, the cosine of the included ...
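A hedged numerical sketch of the two notions named in the snippet, the inner product of two function vectors and the cosine of their included angle, approximated by discretizing the integral over [-1, 1] (the usual domain of the structured element). The specific criterion functions are illustrative assumptions.

import numpy as np

xs = np.linspace(-1.0, 1.0, 2001)
dx = xs[1] - xs[0]

def inner_product(f, g):
    """<f, g> approximated as the sum over components of the integral of
    f_i(x) * g_i(x) over [-1, 1], via a simple Riemann sum."""
    return sum(np.sum(fi(xs) * gi(xs)) * dx for fi, gi in zip(f, g))

def cosine(f, g):
    """Cosine of the included angle between two function vectors."""
    return inner_product(f, g) / np.sqrt(inner_product(f, f) * inner_product(g, g))

# Two toy criterion-function vectors (one function per criterion).
f = [lambda x: 0.5 + 0.2 * x, lambda x: 0.7 + 0.1 * x]
g = [lambda x: 0.6 + 0.3 * x, lambda x: 0.4 - 0.1 * x]
print(cosine(f, g))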
Unsupervised Learning of Word Semantic Embedding using the Deep Structured Semantic Model
Deep neural network (DNN) based natural language processing models rely on a word embedding matrix to transform raw words into vectors. Recently, a deep structured semantic model (DSSM) has been proposed to project raw text to a continuous-valued vector for web search. In this technical report, we propose learning word embeddings using the DSSM. We show that the DSSM trained on a large body of text ...
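A simplified, untrained forward-pass sketch of the pipeline the snippet describes: an embedding matrix maps raw words to vectors, a small projection layer maps them into a shared semantic space, and relatedness is scored by cosine similarity. The vocabulary, sizes, and random weights are placeholders, not the report's actual model.

import numpy as np

rng = np.random.default_rng(0)
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding = rng.normal(size=(len(vocab), 8))     # word embedding matrix
W = rng.normal(size=(8, 4))                      # one projection layer of the DNN

def project(word):
    """Look up a word vector and project it into the semantic space."""
    v = embedding[vocab[word]]
    return np.tanh(v @ W)

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos(project("king"), project("queen")), cos(project("king"), project("apple")))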
SOLUTION-SET INVARIANT MATRICES AND VECTORS IN FUZZY RELATION INEQUALITIES BASED ON MAX-AGGREGATION FUNCTION COMPOSITION
Fuzzy relation inequalities based on max-F composition are discussed, where F is a binary aggregation function on [0,1]. For a fixed system of fuzzy relation inequalities $A \circ^{F}\mathbf{x} \leq \mathbf{b}$, we characterize all matrices $A'$ for which the solution set of the system $A' \circ^{F}\mathbf{x} \leq \mathbf{b}$ is the same as the original solution set. Similarly, for a fixed matrix $A$, the...
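A small sketch of the max-F composition itself, with F taken to be the minimum t-norm purely for illustration (the article allows any binary aggregation F on [0,1]): the i-th component of A composed with x is the maximum over j of F(a_ij, x_j), and a vector x solves the system when this is component-wise at most b. The matrix and vectors below are invented.

import numpy as np

def max_f_compose(A, x, F=np.minimum):
    """Row-wise max of F(a_ij, x_j), i.e. the max-F composition A ∘^F x."""
    return np.max(F(A, x), axis=1)

A = np.array([[0.7, 0.3],
              [0.5, 0.9]])
b = np.array([0.6, 0.8])
x = np.array([0.6, 0.4])

print(max_f_compose(A, x), bool(np.all(max_f_compose(A, x) <= b)))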
Journal:
Volume / Issue:
Pages: -
Publication date: 2011